39 research outputs found

    Comparison of Four Control Methods for a Five-Choice Assistive Technology

    Severe motor impairments can affect the ability to communicate. The ability to see has a decisive influence on the augmentative and alternative communication (AAC) systems available to the user. To better understand the initial impressions users have of AAC systems, we asked naïve healthy participants to compare two visual systems (a visual P300 brain-computer interface (BCI) and an eye-tracker) and two non-visual systems (an auditory and a tactile P300 BCI). Eleven healthy participants performed 20 selections in a five-choice task with each system. The visual P300 BCI used face stimuli, the auditory P300 BCI used Japanese Hiragana syllables, and the tactile P300 BCI used stimulators on the left small finger, left middle finger, right thumb, right middle finger and right small finger. The eye-tracker required a dwell time of 3 s on the target for selection. We calculated accuracies and information-transfer rates (ITRs) for each control method using the selection time that yielded the highest ITR and an accuracy above 70% for each system. Accuracies of 88% were achieved with the visual P300 BCI (4.8 s selection time, 20.9 bits/min), 70% with the auditory BCI (19.9 s, 3.3 bits/min), 71% with the tactile BCI (18 s, 3.4 bits/min) and 100% with the eye-tracker (5.1 s, 28.2 bits/min). Performance with the eye-tracker and the visual BCI correlated strongly, whereas the correlation between tactile and auditory BCI performance was lower. Our data showed no advantage for either non-visual system in terms of ITR, but the lower correlation of performance suggests that choosing the system that suits a particular user is more important for non-visual systems than for visual systems.
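
    For reference, the ITR figures quoted above can be approximated with the standard Wolpaw formula for a task with N equiprobable choices and accuracy P. The Python sketch below applies it to the numbers in the abstract; the printed values come out slightly below the reported ones, which may simply reflect a different accounting of per-selection time in the paper.

```python
import math

def wolpaw_bits_per_selection(n_choices: int, accuracy: float) -> float:
    """Bits conveyed per selection under the Wolpaw model, which assumes
    equiprobable targets and errors spread uniformly over the other choices."""
    p = accuracy
    bits = math.log2(n_choices)
    if 0.0 < p < 1.0:
        bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n_choices - 1))
    return bits

def itr_bits_per_minute(n_choices: int, accuracy: float, selection_time_s: float) -> float:
    """Information-transfer rate in bits/min for a given time per selection."""
    return wolpaw_bits_per_selection(n_choices, accuracy) * 60.0 / selection_time_s

# Five-choice task, figures taken from the abstract:
print(round(itr_bits_per_minute(5, 0.88, 4.8), 1))  # visual P300 BCI, ~19.4 bits/min
print(round(itr_bits_per_minute(5, 1.00, 5.1), 1))  # eye-tracker, ~27.3 bits/min
```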

    Task-Related c-Fos Expression in the Posterior Parietal Cortex During the “Rubber Tail Task” Is Diminished in Ca2+-Dependent Activator Protein for Secretion 2 (Caps2)-Knockout Mice

    Rubber hand illusion (RHI), a kind of body ownership illusion, is sometimes atypical in individuals with autism spectrum disorder; however, the brain regions associated with the illusion are still unclear. We previously reported that mice responded as if their own tails were being touched when rubber tails were grasped following synchronous stroking of the rubber tails and their own tails (a “rubber tail illusion”, RTI), a task based on the human RHI; furthermore, we reported that the RTI response was diminished in Ca2+-dependent activator protein for secretion 2-knockout (Caps2-KO) mice that exhibit autistic-like phenotypes. The importance of the posterior parietal cortex in the formation of illusory perception has previously been reported in human imaging studies. However, the local neural circuits and cell properties associated with this process are not clear. Therefore, we aimed to elucidate the neural basis of the RTI response and its impairment by investigating c-Fos expression in both wild-type (WT) and Caps2-KO mice during the task, since c-Fos expression occurs soon after neural activation. Immediately after synchronous stroking was delivered to both the rubber tails and the actual tails, the mice were perfused. Subsequently, whole brains were cryo-sectioned, each section was immunostained with anti-c-Fos antibody, and c-Fos-positive cell densities were compared among the groups. The c-Fos expression in the posterior parietal cortex was significantly lower in the Caps2-KO mice than in the WT mice. Additionally, we compared c-Fos expression in the WT mice between synchronous and asynchronous conditions and found that the c-Fos-positive cell densities were significantly higher in the claustrum and primary somatosensory cortex of the WT mice exposed to the synchronous condition than in those exposed to the asynchronous condition. Hence, the results suggest that decreased c-Fos expression in the posterior parietal cortex may be related to impaired multisensory integration in Caps2-KO mice.

    Affective Stimuli for an Auditory P300 Brain-Computer Interface

    Gaze-independent brain-computer interfaces (BCIs) are a potential communication tool for persons with paralysis. This study used affective auditory stimuli to investigate their effects in a P300 BCI. Fifteen able-bodied participants operated the P300 BCI with positive and negative affective sounds (PA: a meowing cat sound, NA: a screaming cat sound). Permuted versions of the positive and negative affective sounds (permuted-PA, permuted-NA) were also used for comparison. Electroencephalography data were collected, and offline classification accuracies were compared. We used a visual analog scale (VAS) to measure positive and negative affective feelings in the participants. The mean classification accuracies were 84.7% for PA and 67.3% for permuted-PA, while the VAS scores were 58.5 for PA and −12.1 for permuted-PA. The positive affective stimulus showed significantly higher accuracy and VAS scores than the negative affective stimulus. In contrast, the mean classification accuracies were 77.3% for NA and 76.0% for permuted-NA, while the VAS scores were −50.0 for NA and −39.2 for permuted-NA; these were not significantly different. We determined that a positive affective stimulus accompanied by positive affective feelings significantly improved BCI accuracy. Additionally, an ALS patient achieved 90% online classification accuracy. These results suggest that affective stimuli may be useful for developing a practical auditory BCI system for patients with disabilities.
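
    The abstract does not describe the offline analysis pipeline; a common approach for P300 BCIs is to cut stimulus-locked EEG epochs, baseline-correct them, and average target versus non-target responses before feeding features to a classifier such as LDA. The sketch below is a minimal, hypothetical version of that epoching and averaging step; the array layout, window lengths, and variable names are assumptions, not details taken from the paper.

```python
import numpy as np

def epoch_and_average(eeg, stim_onsets, is_target, fs, tmin=-0.1, tmax=0.8):
    """Cut stimulus-locked epochs from continuous EEG (shape: channels x samples),
    baseline-correct each epoch on the pre-stimulus interval, and return the
    averaged target and non-target responses (the P300 shows up in their difference)."""
    pre, post = int(-tmin * fs), int(tmax * fs)
    epochs = []
    for onset in stim_onsets:
        ep = eeg[:, onset - pre: onset + post].astype(float)
        ep -= ep[:, :pre].mean(axis=1, keepdims=True)  # subtract pre-stimulus baseline
        epochs.append(ep)
    epochs = np.stack(epochs)                  # (n_epochs, n_channels, n_times)
    is_target = np.asarray(is_target, dtype=bool)
    return epochs[is_target].mean(axis=0), epochs[~is_target].mean(axis=0)
```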

    Spatio-Temporal Updating in the Left Posterior Parietal Cortex

    Adopting an unusual posture can sometimes give rise to paradoxical experiences. For example, the subjective ordering of successive unseen tactile stimuli delivered to the two arms can be affected when people cross them. A growing body of evidence now highlights the role played by the parietal cortex in spatio-temporal information processing when sensory stimuli are delivered to the body or when actions are executed; however, little is known about the neural basis of the paradoxical feelings that result from such unusual limb positions. Here, we demonstrate increased fMRI activation in the left posterior parietal cortex when human participants adopted a crossed-hands posture with their eyes closed. Furthermore, by assessing tactile temporal order judgments (TOJs) in the same individuals, we observed a positive association between activity in this area and the degree of reversal in TOJs resulting from crossing the arms. The strongest positive association was observed in the left intraparietal sulcus. This result implies that the left posterior parietal cortex may be critically involved in monitoring limb position and in spatio-temporal binding when serial events are delivered to the limbs.

    Neural Correlates of Attitude Change Following Positive and Negative Advertisements

    Understanding changes in attitudes towards others is critical to understanding human behaviour. Neuropolitical studies have found that activation of emotion-related areas in the brain is linked to resilient political preferences, and neuroeconomic research has analysed the neural correlates of social preferences that favour or oppose consideration of intrinsic rewards. This study aims to identify the neural correlates, in the prefrontal cortices, of changes in political attitudes toward others that are linked to social cognition. In functional magnetic resonance imaging (fMRI) experiments, videos from previous electoral campaigns and television commercials for major cola brands were presented, and the subjects' self-rated affinity toward political candidates was used as a behavioural indicator. After viewing negative campaign videos, subjects showing stronger fMRI activation in the dorsolateral prefrontal cortex lowered their ratings of the candidate they originally supported more than did those with smaller fMRI signal changes in the same region. Subjects showing stronger activation in the medial prefrontal cortex tended to increase their ratings more than did those with less activation. The same regions were not activated by viewing negative advertisements for cola. Correlations between the self-rated values and the neural signal changes underscore the metric representation of observed decisions (i.e., whether to support a candidate or not) in the brain. This indicates that neurometric analysis may contribute to the exploration of the neural correlates of daily social behaviour.

    An evaluation of training with an auditory P300 brain-computer interface for the Japanese Hiragana syllabary

    Gaze-independent brain-computer interfaces (BCIs) are a possible communication channel for persons with paralysis. We investigated whether it is possible to use auditory stimuli to create a BCI for the Japanese Hiragana syllabary, which has 46 Hiragana characters. Additionally, we investigated whether training has an effect on accuracy despite the large number of different stimuli involved. Able-bodied participants (N = 6) were asked to select 25 syllables (out of fifty possible choices) using a two-step procedure: first the consonant (ten choices) and then the vowel (five choices). This was repeated on three separate days. Additionally, a person with spinal cord injury (SCI) participated in the experiment. Four of the six healthy participants reached Hiragana syllable accuracies above 70%, and the information transfer rate increased from 1.7 bits/min in the first session to 3.2 bits/min in the third session. The accuracy of the participant with SCI increased from 12% (0.2 bits/min) to 56% (2 bits/min) in session three. Reliable selections from a 10 × 5 matrix using auditory stimuli were therefore possible, and performance increased with training. We were able to show that auditory P300 BCIs can be used for communication with up to fifty symbols. This enables auditory P300 BCI technology to be used in a variety of applications.
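
    As a rough illustration of how the two-step selection (consonant, then vowel) translates into syllable-level accuracy and bit rate, the sketch below assumes the two steps err independently and applies a Wolpaw-style estimate to the combined 10 × 5 choice. The per-step accuracies and timings are invented for illustration and are not the paper's numbers.

```python
import math

# Illustrative (not reported) per-step figures:
p_consonant, p_vowel = 0.85, 0.90          # accuracy of each selection step
t_consonant_s, t_vowel_s = 60.0, 40.0      # assumed time spent on each step

# If the two steps err independently, the syllable is correct only when both are.
p_syllable = p_consonant * p_vowel         # 0.765 with these figures

# Wolpaw-style bits per syllable, treating the combined choice as one 50-way selection
# with errors spread uniformly over the 49 wrong symbols (an approximation).
n = 50
bits = (math.log2(n)
        + p_syllable * math.log2(p_syllable)
        + (1 - p_syllable) * math.log2((1 - p_syllable) / (n - 1)))
itr = bits * 60.0 / (t_consonant_s + t_vowel_s)
print(f"syllable accuracy ≈ {p_syllable:.2f}, ITR ≈ {itr:.1f} bits/min")  # ≈ 2.1 bits/min
```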

    Event-Related Desynchronization and Corticomuscular Coherence Observed During Volitional Swallow by Electroencephalography Recordings in Humans

    Swallowing in humans involves many cortical areas, although it is partly mediated by a series of brainstem reflexes. Cortical motor commands are sent to the muscles during swallowing. Previous work using magnetoencephalography showed event-related desynchronization (ERD) during swallowing and corticomuscular coherence (CMC) during tongue movements in the bilateral sensorimotor and motor-related areas. However, there have been few analogous studies using electroencephalography (EEG). We investigated ERD and CMC in the bilateral sensorimotor, premotor, and inferior prefrontal areas during volitional swallowing by EEG recordings in 18 healthy human subjects. We found significant ERD in the beta frequency band and CMC in the theta, alpha, and beta frequency bands during swallowing in these cortical areas. These results suggest that EEG can detect the desynchronized activity and oscillatory interaction between the cortex and the pharyngeal muscles in the bilateral sensorimotor, premotor, and inferior prefrontal areas during volitional swallowing in humans.
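
    As a minimal sketch of the two quantities reported above, the code below computes band-power ERD in the classical way (percentage power change in a task window relative to a pre-task reference) and magnitude-squared corticomuscular coherence between an EEG channel and a rectified EMG channel using SciPy. The 15-25 Hz band and the window boundaries are assumptions for illustration, not the paper's exact analysis settings.

```python
import numpy as np
from scipy.signal import butter, filtfilt, coherence

def erd_percent(eeg, fs, band=(15.0, 25.0), baseline_s=(0.0, 2.0), task_s=(3.0, 5.0)):
    """Band-power ERD for one EEG channel: percentage change of band power in the
    task window relative to a reference window; negative values indicate ERD."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    power = filtfilt(b, a, eeg) ** 2
    ref = power[int(baseline_s[0] * fs): int(baseline_s[1] * fs)].mean()
    task = power[int(task_s[0] * fs): int(task_s[1] * fs)].mean()
    return 100.0 * (task - ref) / ref

def corticomuscular_coherence(eeg, emg, fs, nperseg=1024):
    """Magnitude-squared coherence between an EEG channel and rectified EMG."""
    freqs, cxy = coherence(eeg, np.abs(emg), fs=fs, nperseg=nperseg)
    return freqs, cxy
```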

    Case Report: Event-Related Desynchronization Observed During Volitional Swallow by Electroencephalography Recordings in ALS Patients With Dysphagia

    Dysphagia is a severe disability affecting daily life in patients with amyotrophic lateral sclerosis (ALS). It is caused by degeneration of both the bulbar motor neurons and the cortical motoneurons projecting to the oropharyngeal areas. A previous report showed decreased event-related desynchronization (ERD) in the medial sensorimotor areas in ALS patients with dysphagia. In the process of degeneration, brain reorganization may also be induced in areas other than the sensorimotor cortices. Furthermore, ALS patients with dysphagia often show a longer duration of swallowing. However, there have been no reports on brain activity in other cortical areas or on the time course of brain activity during prolonged swallowing in these patients. In this case report, we investigated the distribution and time course of ERD and corticomuscular coherence (CMC) in the beta (15-25 Hz) frequency band during volitional swallowing using electroencephalography (EEG) in two patients with ALS. Case 1 (a 71-year-old man) was diagnosed 2 years before the evaluation. His first symptom was muscle weakness in the right hand; 5 months later, dysphagia developed and worsened. Because his dietary intake decreased, he was given an implantable venous access port. Case 2 (a 64-year-old woman) was diagnosed 1 year before the evaluation. Her first symptoms were an open-nasal voice and dysarthria; 3 months later, dysphagia developed and worsened. She was given a percutaneous endoscopic gastrostomy. EEG recordings were performed during volitional swallowing, and the ERD was calculated. The average swallow durations were 7.6 ± 3.0 s in Case 1 and 8.3 ± 2.9 s in Case 2. In Case 1, significant ERD was localized to the prefrontal and premotor areas and lasted from a few seconds after the initiation of swallowing until its end. In Case 2, the ERD was localized to the lateral sensorimotor areas only at the initiation of swallowing. CMC was not observed in either case. These results suggest that compensatory processes for cortical motor output might differ between individual patients and that new therapeutic approaches using ERD should be tailored to the individual ALS patient with dysphagia.